Does SLOPE outperform bridge regression?

Authors

Abstract

A recently proposed SLOPE estimator [6] has been shown to adaptively achieve the minimax $\ell_2$ estimation rate under high-dimensional sparse linear regression models [25]. Such minimax optimality holds in the regime where the sparsity level $k$, the sample size $n$ and the dimension $p$ satisfy $k/p \rightarrow 0$, $k\log p/n \rightarrow 0$. In this paper, we characterize the estimation error of SLOPE in the complementary regime where both $k$ and $n$ scale linearly with $p$, and provide new insights into the performance of SLOPE estimators. We first derive a concentration inequality for the finite-sample mean square error (MSE) of SLOPE. The quantity that the MSE concentrates around takes a complicated and implicit form. Through a delicate analysis of this quantity, we prove that among all SLOPE estimators, LASSO is optimal for estimating $k$-sparse parameter vectors that do not have tied nonzero components in the low noise scenario. On the other hand, in the large noise scenario, the family of SLOPE estimators is sub-optimal compared with bridge regression such as the Ridge estimator.
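For reference, the two families of estimators compared in this abstract can be written as penalized least squares problems; the displays below are the standard definitions (following the SLOPE formulation of [6]), included only as background and not taken from the abstract itself.

$$\hat\beta_{\mathrm{SLOPE}} \in \arg\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2}\|y - X\beta\|_2^2 + \sum_{i=1}^{p} \lambda_i |\beta|_{(i)}, \qquad \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0,$$

where $|\beta|_{(1)} \ge |\beta|_{(2)} \ge \cdots \ge |\beta|_{(p)}$ denote the absolute values of the coordinates of $\beta$ arranged in decreasing order; choosing all $\lambda_i$ equal to a common $\lambda$ recovers the LASSO. Bridge regression with exponent $q > 0$ instead solves

$$\hat\beta_{\mathrm{bridge}} \in \arg\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2}\|y - X\beta\|_2^2 + \lambda \sum_{i=1}^{p} |\beta_i|^{q},$$

which gives the LASSO at $q = 1$ and the Ridge estimator at $q = 2$.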


Similar articles

Does a Gibbs sampler approach to spatial Poisson regression models outperform a single site MH sampler?

In this paper we present and evaluate a Gibbs sampler for a Poisson regression model including spatial effects. The approach is based on Frühwirth-Schnatter and Wagner (2004b), who show that, by data augmentation through the introduction of two sequences of latent variables, a Poisson regression model can be transformed into an approximate normal linear model. We show how this methodology can be ext...


When Does a Pair Outperform Two Individuals?

This paper reports experimental measurements of productivity and quality in pair programming. The work complements Laurie Williams’ work on collaborative programming, in which Pair Programming and Solo Programming student groups wrote the same programs and then their activities were measured to investigate productivity, quality, etc. In this paper, Pair and Solo industrial programmer groups are...


Variational Bridge Regression

Here we obtain approximate Bayes inferences through variational methods when an exponential power family type prior is specified for the regression coefficients to mimic the characteristics of the Bridge regression. We accomplish this through hierarchical modeling of such priors. Although the mixing distribution is not explicitly stated for scale normal mixtures, we obtain the required moments ...
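As background for the scale-mixture construction mentioned above (a standard fact, not a statement of this paper's derivation), an exponential power prior on a coefficient $\beta_j$ admits a scale mixture of normals representation:

$$\pi(\beta_j) \propto \exp\!\left(-\lambda |\beta_j|^{q}\right) = \int_0^{\infty} \mathcal{N}\!\left(\beta_j \mid 0, \tau_j\right) g(\tau_j)\, d\tau_j, \qquad 0 < q \le 2,$$

where the mixing density $g$ is related to a positive stable law and is typically not available in closed form; hierarchical and variational treatments therefore work with the required moments of $\tau_j$ rather than with $g$ itself.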


Robust and sparse bridge regression

It is known that when there are heavy-tailed errors or outliers in the response, the least squares methods may fail to produce a reliable estimator. In this paper, we propose a generalized Huber criterion which is highly flexible and robust to large errors. We apply the new criterion to the bridge regression family, called robust and sparse bridge regression (RSBR). However, to get the RSBR...
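As a hedged illustration only (the paper's exact generalized Huber criterion is not reproduced in this snippet), a robust bridge objective of the kind described replaces the squared loss with a Huber-type loss $\rho_{\delta}$:

$$\hat\beta \in \arg\min_{\beta \in \mathbb{R}^p} \; \sum_{i=1}^{n} \rho_{\delta}\!\left(y_i - x_i^{\top}\beta\right) + \lambda \sum_{j=1}^{p} |\beta_j|^{q}, \qquad \rho_{\delta}(r) = \begin{cases} \tfrac{1}{2} r^2, & |r| \le \delta,\\[2pt] \delta |r| - \tfrac{1}{2}\delta^2, & |r| > \delta, \end{cases}$$

so that large residuals are penalized linearly rather than quadratically, which is what makes such an estimator robust to heavy-tailed errors or outliers in the response.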



Journal

Journal title: Information and Inference: A Journal of the IMA

Year: 2021

ISSN: 2049-8772, 2049-8764

DOI: https://doi.org/10.1093/imaiai/iaab025